# Low Perplexity
## Duo Distilled
- **License:** Apache-2.0
- **Description:** DUO is a pre-trained text-generation model that can also be used for masked language modeling. It is trained on the OpenWebText corpus.
- **Tags:** Large Language Model, Transformers, English
- **Author:** s-sahoo
- **Downloads:** 98.21k
- **Likes:** 1
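Since the card lists masked language modeling as a supported task, a minimal loading sketch follows. Everything repo-specific is an assumption: the repo id `s-sahoo/duo-distilled` is inferred from the author name above, and `trust_remote_code=True` is included only in case the repo ships custom modeling code.

```python
# Minimal sketch; repo id and remote-code support are assumptions (see above).
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "s-sahoo/duo-distilled"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)

# Predict the most likely token at a masked position.
text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(-1)))
```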
## RWKV7 Goose World3 1.5B HF
- **License:** Apache-2.0
- **Description:** The RWKV-7 model in flash-linear-attention format, supporting English text generation tasks.
- **Tags:** Large Language Model, English
- **Author:** RWKV
- **Downloads:** 70
- **Likes:** 2
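A minimal generation sketch for this entry. Assumptions: the repo id `RWKV/rwkv7-goose-world3-1.5b-hf` is inferred from the entry name, and flash-linear-attention-format repos typically also need the `flash-linear-attention` package installed.

```python
# Minimal sketch: English text generation with RWKV-7 in fla format.
# The repo id below is an assumption inferred from the entry name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RWKV/rwkv7-goose-world3-1.5b-hf"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```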
## Open Calm 3b
- **Description:** OpenCALM is the 3B-parameter version of the decoder-only language model series developed by CyberAgent, pretrained on Japanese datasets.
- **Tags:** Large Language Model, Transformers, Japanese
- **Author:** cyberagent
- **Downloads:** 850
- **Likes:** 20
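A minimal sketch of Japanese text generation with this model. The repo id `cyberagent/open-calm-3b` is inferred from the entry above and should be verified before use.

```python
# Minimal sketch: Japanese text generation with OpenCALM-3B.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cyberagent/open-calm-3b"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# "The capital of Japan is" as a Japanese prompt.
inputs = tokenizer("日本の首都は", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```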
## SanBERTa
- **Description:** SanBERTa is a RoBERTa model trained on Sanskrit text, designed for Sanskrit text-processing tasks.
- **Tags:** Large Language Model, Other
- **Author:** surajp
- **Downloads:** 15
- **Likes:** 2
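A minimal masked-token-prediction sketch via the fill-mask pipeline. The repo id `surajp/SanBERTa` is an assumption based on the author name above, and the Sanskrit example sentence is illustrative only.

```python
# Minimal sketch: masked-token prediction with SanBERTa.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="surajp/SanBERTa")  # assumed repo id
mask = fill_mask.tokenizer.mask_token  # "<mask>" for RoBERTa-style models
# "Rama <mask> to the forest." in Sanskrit, masking the verb.
for pred in fill_mask(f"रामः वनं {mask}।"):
    print(pred["token_str"], round(pred["score"], 3))
```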
## Rugpt3large Based On Gpt2
- **Description:** A large-scale Russian pretrained Transformer language model based on the GPT-2 architecture, trained by the SberDevices team.
- **Tags:** Large Language Model, Other
- **Author:** ai-forever
- **Downloads:** 9,985
- **Likes:** 86
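A minimal Russian generation sketch. The repo id `ai-forever/rugpt3large_based_on_gpt2` is inferred from the entry above.

```python
# Minimal sketch: Russian text generation with ruGPT-3 Large.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="ai-forever/rugpt3large_based_on_gpt2",  # assumed repo id
)
# "Alexander Pushkin was born in the city of" as a Russian prompt.
result = generator("Александр Пушкин родился в городе", max_new_tokens=30)
print(result[0]["generated_text"])
```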
## MelayuBERT
- **License:** MIT
- **Description:** A Malay masked language model based on the BERT architecture, trained on the Malay subset of the OSCAR dataset; supports both the PyTorch and TensorFlow frameworks.
- **Tags:** Large Language Model, Transformers, Other
- **Author:** StevenLimcorn
- **Downloads:** 15
- **Likes:** 0
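A minimal fill-mask sketch for this entry. The repo id `StevenLimcorn/MelayuBERT` is an assumption based on the author name, and the Malay example sentence is illustrative only.

```python
# Minimal sketch: masked-token prediction with MelayuBERT.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="StevenLimcorn/MelayuBERT")  # assumed repo id
mask = fill_mask.tokenizer.mask_token  # "[MASK]" for BERT-style models
# "I like to eat [MASK]." in Malay.
for pred in fill_mask(f"Saya suka makan {mask}."):
    print(pred["token_str"], round(pred["score"], 3))
```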
## Gpt Fr Cased Small
- **License:** Apache-2.0
- **Description:** GPT-fr is a French GPT model developed by Quantmetry and the Laboratoire de Linguistique Formelle (LLF), trained on a large and diverse corpus of French texts.
- **Tags:** Large Language Model, French
- **Author:** asi
- **Downloads:** 4,314
- **Likes:** 8
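A minimal French generation sketch. The repo id `asi/gpt-fr-cased-small` is inferred from the entry above.

```python
# Minimal sketch: French text generation with GPT-fr (small, cased).
from transformers import pipeline

generator = pipeline("text-generation", model="asi/gpt-fr-cased-small")  # assumed repo id
result = generator("Longtemps, je me suis couché de bonne heure,", max_new_tokens=30)
print(result[0]["generated_text"])
```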